Some Modifications to Calculate Regression Coefficients in Multiple Linear Regression
Authors
Abstract:
In a multiple linear regression model, the regression parameters sometimes have to be updated: as new data become available, each new observation adds a row to the design matrix, and the least-squares estimates must be revised to reflect the new data. We modify two existing methods of calculating regression coefficients in multiple linear regression to make these computations more efficient. Starting from an initial solution, we first employ the Sherman-Morrison formula to update the inverse of the product of the transposed design matrix and the design matrix. We then modify the computation involving this product by using the Cholesky decomposition to solve the resulting system. Finally, we compare the two modifications on several appropriate examples.
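To make the two strategies in the abstract concrete, here is a minimal sketch, assuming a standard NumPy setup; the function names and the simulated data are illustrative placeholders and are not taken from the paper.

```python
import numpy as np

def update_inverse_sherman_morrison(XtX_inv, x_new):
    """Rank-one update of (X^T X)^{-1} after appending the row x_new to X.

    Sherman-Morrison with A = X^T X and u = v = x_new:
    (A + u u^T)^{-1} = A^{-1} - (A^{-1} u)(A^{-1} u)^T / (1 + u^T A^{-1} u).
    """
    Au = XtX_inv @ x_new                      # A^{-1} u
    denom = 1.0 + x_new @ Au                  # 1 + u^T A^{-1} u
    return XtX_inv - np.outer(Au, Au) / denom

def solve_cholesky(X, y):
    """Solve the normal equations (X^T X) beta = X^T y via a Cholesky factorization."""
    XtX = X.T @ X
    L = np.linalg.cholesky(XtX)               # X^T X = L L^T
    z = np.linalg.solve(L, X.T @ y)           # forward step:  L z = X^T y
    return np.linalg.solve(L.T, z)            # backward step: L^T beta = z

# Toy comparison on simulated data (hypothetical example, not the paper's data)
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.standard_normal(50)

XtX_inv = np.linalg.inv(X.T @ X)              # "initial solution" before new data arrive
x_new, y_new = rng.standard_normal(4), 0.7    # one new observation

# Route 1: update the stored inverse, then beta = (X^T X)^{-1} X^T y on the augmented data
XtX_inv_new = update_inverse_sherman_morrison(XtX_inv, x_new)
X_aug = np.vstack([X, x_new])
y_aug = np.append(y, y_new)
beta_sm = XtX_inv_new @ (X_aug.T @ y_aug)

# Route 2: refactorize the augmented normal equations with Cholesky
beta_chol = solve_cholesky(X_aug, y_aug)

print(np.allclose(beta_sm, beta_chol))        # both routes solve the same least-squares problem
```

Roughly, the rank-one update reuses the stored inverse and costs on the order of p² operations per new row, whereas refactorizing the normal equations with Cholesky costs on the order of p³ plus the cost of forming X^T X; the trade-off between the two for data of different sizes is presumably what the paper's comparative examples examine.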
similar resources
Linear Regression Under Multiple Changepoints
This dissertation studies the least squares estimator of a trend parameter in a simple linear regression model with multiple changepoints when the changepoint times are known. The error component in the model is allowed to be autocorrelated. The least squares estimator of the trend and the variance of the trend estimator are derived. Consistency and asymptotic normality of the trend estimator a...
Fuzzy linear regression analysis with trapezoidal coefficients
In this paper, we aim to extend the constraints of Tanaka’s model. The coefficients used in their fuzzy regression are symmetric triangular fuzzy numbers, which we replace with more general asymmetric trapezoidal ones. The possibility of two asymmetric trapezoidal fuzzy numbers is explained by a possibility distribution. Two different models are presented and a numerical example is gi...
Interval linear regression
In this paper, we have studied an interval linear regression model for fuzzy data. In section one, we introduce the concepts required in this thesis and then illustrate linear regression, fuzzy sets, and some primary definitions. In section two, we introduce various methods of interval linear regression analysis. In section three, we have implemented nu...
Multiple Linear Regression Models in Outlier Detection
Identifying anomalous values in real-world databases is important both for improving the quality of the original data and for reducing the impact of anomalous values in the process of knowledge discovery in databases. Such anomalous values give useful information to the data analyst in discovering useful patterns. Through isolation, these data may be separated and analyzed. The analysis of outlie...
Multiple testing in high-dimensional linear regression
In many real-world statistical problems, we observe a large number of potentially explanatory variables of which a majority may be irrelevant. For this type of problem, controlling the false discovery rate (FDR) guarantees that most of the discoveries are truly explanatory and thus replicable. In this talk, we propose a new method named SLOPE to control the FDR in sparse high-dimensional linear...
My Resources
Journal title
volume 2, issue 1
pages 75-86
publication date 2008-04-01